92.
It is shown that several recursive least squares (RLS) type equalization algorithms, such as decision-directed schemes and orthogonalized constant modulus algorithms, possess a common algorithmic structure and are therefore rather straightforwardly implemented on a triangular array (filter structure) for RLS estimation with inverse updating. While the computational complexity of such algorithms is O(N^2), where N is the problem size, the throughput rate of the array implementation is O(1), i.e., independent of the problem size. Such a throughput rate cannot be achieved with standard (Gentleman-Kung-type) RLS/QR-updating arrays because of feedback loops in the computational schemes.
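The inverse-updating idea referred to above can be illustrated with a minimal sketch in Python: a plain RLS recursion that propagates the inverse correlation matrix P directly, at O(N^2) cost per sample. This is only a scalar-processor illustration of the recursion, not the systolic triangular-array (Gentleman-Kung-style) implementation discussed in the abstract; the filter length, forgetting factor and toy identification setup are assumptions made for the example.

```python
import numpy as np

def rls_inverse_update(w, P, x, d, lam=0.99):
    """One RLS step that propagates the inverse correlation matrix P directly
    ("inverse updating"). Each step costs O(N^2) operations for a length-N
    filter; the systolic triangular array achieves O(1) throughput by
    pipelining these updates, which this sketch does not model."""
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = d - w @ x                    # a-priori error (decision-directed: d would be the sliced output)
    w = w + k * e                    # coefficient update
    P = (P - np.outer(k, Px)) / lam  # inverse correlation matrix update
    return w, P, e

# toy usage: identify a 4-tap filter from noisy samples (parameters are illustrative)
rng = np.random.default_rng(0)
N = 4
w_true = rng.standard_normal(N)
w, P = np.zeros(N), 1e3 * np.eye(N)
for _ in range(500):
    x = rng.standard_normal(N)
    d = w_true @ x + 0.01 * rng.standard_normal()
    w, P, _ = rls_inverse_update(w, P, x, d)
```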
93.
We demonstrate controlled transport of superparamagnetic beads against the direction of a laminar flow. A permanent magnet assembles 200 nm magnetic particles into roughly 200 μm long bead chains aligned parallel to the magnetic field lines. Due to a magnetic field gradient, the bead chains are attracted towards the wall of a microfluidic channel. Rotating the permanent magnet causes the bead chains to rotate in the opposite direction to the magnet. Due to friction on the surface, the bead chains roll along the channel wall, even in the counter-flow direction, up to a maximum counter-flow velocity of 8 mm s−1. Based on this approach, magnetic beads can be accurately manoeuvred within microfluidic channels. This counter-flow motion can be used efficiently in Lab-on-a-Chip systems, e.g. for implementing washing steps in DNA purification.
94.
A procedure to find the optimal design of a flywheel with a split-type hub is presented. Since cost plays a decisive role in stationary flywheel energy storage applications, a trade-off between energy and cost is required. Applying a scaling technique, the multi-objective design problem is reduced to the maximization of the energy-per-cost ratio as the single objective. Both an analytical and a finite element model were studied. The latter was found to be more than three orders of magnitude more computationally expensive than the analytical model, while the analytical model can only be regarded as a coarse approximation. Multifidelity approaches were examined to reduce the computational expense while retaining the high accuracy and large modeling depth of the finite element model. Using a surrogate-based optimization strategy, the computational cost was reduced to one third of that of using the finite element model alone. A nonlinear interior-point method was employed to find the optimal rim thicknesses and rotational speed. The benefits of the split-type hub architecture were demonstrated.
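As a rough illustration of the energy-per-cost formulation described above, the sketch below maximizes a toy energy-per-cost ratio over rim thickness and rotational speed with an interior-point-style solver (SciPy's trust-constr). All material data, the thin-ring stress estimate and the cost model are invented for the example; the analytical model, finite element model and surrogate-based multifidelity strategy of the actual study are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

RHO, R_IN, SIGMA_MAX, COST_PER_KG = 1800.0, 0.15, 4.0e8, 40.0  # made-up rim data

def mass(t):                               # thin rim of thickness t, unit axial length
    r_out = R_IN + t
    return RHO * np.pi * (r_out**2 - R_IN**2)

def energy_per_cost(x):                    # single objective after scaling: E / cost
    t, omega = x
    r_mean = R_IN + 0.5 * t
    energy = 0.5 * mass(t) * r_mean**2 * omega**2   # E = 1/2 I w^2 with I ~ m r_mean^2
    return energy / (COST_PER_KG * mass(t))

def hoop_stress(x):                        # crude thin-ring estimate: sigma ~ rho r^2 w^2
    t, omega = x
    return RHO * (R_IN + t)**2 * omega**2

res = minimize(lambda x: -energy_per_cost(x),        # maximize by minimizing the negative
               x0=[0.05, 1000.0],
               method="trust-constr",                # interior-point-style NLP solver
               bounds=[(0.01, 0.2), (100.0, 5000.0)],
               constraints=NonlinearConstraint(hoop_stress, 0.0, SIGMA_MAX))
print(res.x, -res.fun)
```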
95.
During the past few years, several works have been done to derive string kernels from probability distributions. For instance, the Fisher kernel uses a generative model M (e.g. a hidden Markov model) and compares two strings according to how they are generated by M. On the other hand, marginalized kernels allow the computation of the joint similarity between two instances by summing conditional probabilities. In this paper, we adapt this approach to edit-distance-based conditional distributions and present a way to learn a new string edit kernel. We show that the practical computation of such a kernel between two strings x and x′ built from an alphabet Σ requires (i) learning edit probabilities in the form of the parameters of a stochastic state machine and (ii) calculating an infinite sum over Σ* by resorting to the intersection of probabilistic automata, as done for rational kernels. We show on a handwritten character recognition task that our new kernel outperforms not only state-of-the-art string kernels and string edit kernels but also the standard edit distance used by a neighborhood-based classifier.
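The learned stochastic edit kernel itself requires trained edit probabilities and intersections of probabilistic automata; as a much simpler stand-in that conveys the flavour of an edit-distance-based string kernel, the sketch below exponentiates a fixed-cost Levenshtein distance. The unit edit costs and the gamma parameter are assumptions, and this simple form is not in general guaranteed to be positive semi-definite, unlike the kernel studied in the paper.

```python
import numpy as np

def levenshtein(x, y):
    """Classic dynamic-programming edit distance with unit costs."""
    m, n = len(x), len(y)
    D = np.zeros((m + 1, n + 1))
    D[:, 0] = np.arange(m + 1)
    D[0, :] = np.arange(n + 1)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            D[i, j] = min(D[i - 1, j] + 1,                           # deletion
                          D[i, j - 1] + 1,                           # insertion
                          D[i - 1, j - 1] + (x[i - 1] != y[j - 1]))  # substitution
    return D[m, n]

def edit_kernel(x, y, gamma=0.5):
    """k(x, y) = exp(-gamma * d_edit(x, y)): a fixed-cost stand-in, not the
    learned stochastic edit kernel of the paper."""
    return np.exp(-gamma * levenshtein(x, y))

print(edit_kernel("kernel", "colonel"))
```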
96.
The SHARC framework for data quality in Web archiving (cited 1 time: 0 self-citations, 1 by others)
Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and legal experts on intellectual property and compliance issues. Data quality is crucial for these purposes. Ideally, crawlers should gather coherent captures of entire Web sites, but the politeness etiquette and completeness requirement mandate very slow, long-duration crawling while Web sites undergo changes. This paper presents the SHARC framework for assessing the data quality in Web archives and for tuning capturing strategies toward better quality with given resources. We define data quality measures, characterize their properties, and develop a suite of quality-conscious scheduling strategies for archive crawling. Our framework includes single-visit and visit–revisit crawls. Single-visit crawls download every page of a site exactly once in an order that aims to minimize the "blur" in capturing the site. Visit–revisit strategies revisit pages after their initial downloads to check for intermediate changes. The revisiting order aims to maximize the "coherence" of the site capture (the number of pages that did not change during the capture). The quality notions of blur and coherence are formalized in the paper. Blur is a stochastic notion that reflects the expected number of page changes that a time-travel access to a site capture would accidentally see, instead of the ideal view of an instantaneously captured, "sharp" site. Coherence is a deterministic quality measure that counts the number of unchanged and thus coherently captured pages in a site snapshot. Strategies that aim to either minimize blur or maximize coherence are based on prior knowledge of, or predictions for, the change rates of individual pages. Our framework includes fairly accurate classifiers for change predictions. All strategies are fully implemented in a testbed and shown to be effective by experiments with both synthetically generated sites and a periodic crawl series for different Web sites.
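The deterministic coherence measure can be sketched directly: given the capture interval and (assumed known) change timestamps per page, count the pages that did not change during the capture. The function and toy data below are illustrative only; the paper's scheduling strategies work with predicted change rates rather than the perfect knowledge assumed here.

```python
def coherence(download_times, change_times, t_start, t_end):
    """Count pages that did not change during the capture interval
    [t_start, t_end], i.e. pages captured coherently.

    download_times: {url: fetch time}  (unused in this coarse variant, kept
                    only to mirror a crawl-log structure)
    change_times:   {url: list of observed change timestamps}
    """
    coherent = 0
    for url in download_times:
        changes = change_times.get(url, [])
        if not any(t_start <= c <= t_end for c in changes):
            coherent += 1
    return coherent

# toy capture of a three-page site; /b changed mid-capture
downloads = {"/a": 1.0, "/b": 2.0, "/c": 3.0}
changes = {"/a": [0.5], "/b": [2.5], "/c": []}
print(coherence(downloads, changes, t_start=1.0, t_end=3.0))   # -> 2
```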
97.
Enterprise Architecture (EA) is increasingly being used by large organizations to get a grip on the complexity of their business processes, information systems and technical infrastructure. Although EA is seen as an important instrument to help solve major organizational problems, applying it effectively is no easy task. Active participation of EA stakeholders is one of the main critical success factors for EA. This participation depends on the degree to which EA helps stakeholders achieve their individual goals. A closely related topic is the effectiveness of EA, the degree to which EA helps to achieve the collective goals of the organization. In this article we present our work regarding EA stakeholder satisfaction and EA effectiveness, and compare these two topics. We found that, regarding EA, the individual goals of stakeholders map quite well onto the collective goals of the organization. In a case study we conducted, we found that the organization is primarily concerned with the final results of EA, while individual stakeholders also worry about the way the architects operate.
99.
The present paper deals with the modeling of wind turbine generation systems. The model of a doubly fed induction generator, along with the corresponding converter, crowbar protection and electrical grid, is described. The control strategies at the different levels, both in normal operation and under voltage dip conditions, are discussed, including speed control, torque and reactive power control for the rotor-side converter, reactive power and DC voltage control for the grid-side converter, and the corresponding current-loop controls. The simulation results are compared to experimental data obtained from voltage sags provoked on real wind turbines.
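The inner current loops mentioned above are typically plain PI regulators; the sketch below shows a generic discrete PI controller with output clamping and anti-windup driving a crude R-L plant. The gains, limits and plant parameters are made up for illustration and do not come from the paper's DFIG model.

```python
class PIController:
    """Discrete PI regulator with output clamping and conditional-integration
    anti-windup, of the kind typically used for converter current loops
    (illustrative only; gains and limits are invented)."""

    def __init__(self, kp, ki, dt, u_min, u_max):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def step(self, reference, measurement):
        error = reference - measurement
        u_unsat = self.kp * error + self.ki * (self.integral + error * self.dt)
        u = min(max(u_unsat, self.u_min), self.u_max)
        if u == u_unsat:                       # integrate only when not saturated
            self.integral += error * self.dt
        return u

# track a constant current reference; an outer loop (e.g. reactive power
# control) would normally supply this reference
ctrl = PIController(kp=2.0, ki=50.0, dt=1e-4, u_min=-400.0, u_max=400.0)
i = 0.0
for _ in range(2000):
    v = ctrl.step(reference=10.0, measurement=i)
    i += (v - 5.0 * i) * 1e-4 / 0.01           # crude R-L plant: L di/dt = v - R i
print(i)
```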
100.
A computational framework for fuel cell analysis and optimization is presented as an innovative alternative to the time-consuming trial-and-error process currently used for fuel cell design. The framework is based on a two-dimensional, through-the-channel, isothermal, isobaric and single-phase membrane electrode assembly (MEA) model. The model input parameters are the manufacturing parameters used to build the MEA: platinum loading, platinum-to-carbon ratio, electrolyte content and gas diffusion layer porosity. The governing equations of the fuel cell model are solved using Newton's algorithm and an adaptive finite element method in order to achieve near-quadratic convergence and a mesh-independent solution, respectively. The analysis module is used to solve the optimization problem of finding the optimal MEA composition for maximizing performance. To solve the optimization problem, a gradient-based optimization algorithm is used in conjunction with analytical sensitivities. By using a gradient-based method and analytical sensitivities, the framework presented is capable of solving a complete MEA optimization problem with state-of-the-art electrode models in approximately 30 min, making it a viable alternative for solving large-scale fuel cell problems.
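Newton's algorithm is what gives the solver its near-quadratic convergence; the sketch below applies a bare Newton iteration to a tiny made-up two-equation nonlinear system standing in for the coupled MEA governing equations. The residual, Jacobian and starting point are assumptions for the example, and the adaptive finite element discretization is omitted entirely.

```python
import numpy as np

def newton(residual, jacobian, u0, tol=1e-10, max_iter=20):
    """Plain (undamped) Newton iteration for a nonlinear system R(u) = 0,
    illustrating the near-quadratic convergence the framework relies on."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            break
        u = u - np.linalg.solve(jacobian(u), r)
    return u

# toy two-equation "reaction + transport" system, purely illustrative
def R(u):
    c, phi = u
    return np.array([c - 1.0 + 0.2 * np.exp(phi),    # species balance
                     phi - 0.5 + 0.1 * c])           # potential equation

def J(u):
    c, phi = u
    return np.array([[1.0, 0.2 * np.exp(phi)],
                     [0.1, 1.0]])

print(newton(R, J, u0=[1.0, 0.5]))
```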